Numerical Experience with Limited-Memory Quasi-Newton and Truncated Newton Methods

Authors

  • X. Zou
  • Ionel Michael Navon
  • M. Berger
  • Paul Kang-Hoh Phua
  • Tamar Schlick
  • François-Xavier Le Dimet
Abstract

Computational experience with several limited-memory quasi-Newton and truncated Newton methods for unconstrained nonlinear optimization is described. Comparative tests were conducted on a well-known test library [J. , on several synthetic problems that allow control of the clustering of eigenvalues in the Hessian spectrum, and on some large-scale problems in oceanography and meteorology. The results indicate that, among the tested limited-memory quasi-Newton methods, the L-BFGS method [D. shows the best overall performance for the problems examined. The numerical performance of two truncated Newton methods, which differ in the inner-loop solution for the search vector, is competitive with that of L-BFGS.
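The core of the L-BFGS method compared above is the two-loop recursion, which applies an implicit inverse-Hessian approximation built from the m most recent correction pairs. The sketch below is a minimal textbook illustration, not the implementation tested in the paper; the function name and the choice of the standard scaling for the initial Hessian are assumptions.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: return the search direction -H*grad, where H is
    the L-BFGS inverse-Hessian approximation built from the stored
    correction pairs s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
    Illustrative sketch only, not the authors' code."""
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    # Scale by the standard initial Hessian gamma = s'y / y'y (newest pair)
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= np.dot(s, y) / np.dot(y, y)
    # Second loop: oldest pair to newest
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * np.dot(y, q)
        q += (a - b) * s
    return -q
```

With no stored pairs the recursion reduces to steepest descent; with pairs from a quadratic, the resulting H satisfies the secant condition H y = s for the stored pairs, which is what makes the limited-memory update cheap (O(mn) work, no n-by-n matrix).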


Related Articles

New class of limited-memory variationally-derived variable metric methods

A new family of limited-memory variationally-derived variable metric, or quasi-Newton, methods for unconstrained minimization is given. The methods have the quadratic termination property and use updates that are invariant under linear transformations. Some encouraging numerical experience is reported.


Limited-memory projective variable metric methods for unconstrained minimization

A new family of limited-memory variable metric, or quasi-Newton, methods for unconstrained minimization is given. The methods are based on a positive definite inverse Hessian approximation in the form of the sum of the identity matrix and two low-rank matrices, obtained by the standard scaled Broyden class update. To reduce the rank of the matrices, various projections are used. Numerical experience is e...
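The memory saving behind an "identity plus low-rank" inverse Hessian approximation is that the matrix is never formed explicitly: a product with a vector costs O(nk) for rank-k terms instead of O(n^2). A schematic sketch, with a generic H = gamma*I + A B^T in place of the paper's specific two-term update (the function name and factor shapes are assumptions):

```python
import numpy as np

def apply_H(v, gamma, A, B):
    """Apply H = gamma*I + A @ B.T to a vector v without ever forming
    the n-by-n matrix H. A and B are n-by-k with k << n, so the cost
    and storage are O(n*k). Illustrative sketch only."""
    return gamma * v + A @ (B.T @ v)
```

For n in the millions and k around 5-10, the low-rank factors fit in memory while the dense matrix would not, which is the point of limited-memory variable metric methods.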


Dynamic scaling based preconditioning for truncated Newton methods in large scale unconstrained optimization

This paper deals with the preconditioning of truncated Newton methods for the solution of large-scale nonlinear unconstrained optimization problems. We focus on preconditioners that can be naturally embedded in the framework of truncated Newton methods, i.e., that can be built without storing the Hessian matrix of the function to be minimized, but only based upon information on the Hessian obt...
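A truncated Newton method approximately solves the Newton system H d = -g by an inner conjugate-gradient loop that is stopped early, using only Hessian-vector products; preconditioning as discussed above accelerates this inner loop. The following is a minimal unpreconditioned sketch of the inner iteration (the function name, forcing factor eta, and the steepest-descent fallback on negative curvature are common textbook choices, not details taken from the paper):

```python
import numpy as np

def truncated_newton_direction(grad, hess_vec, max_iter=50, eta=0.5):
    """Inner loop of a truncated Newton method: approximately solve
    H d = -g with linear conjugate gradients, needing only the
    Hessian-vector product hess_vec(v). CG is truncated once the
    residual norm drops below eta*||g||; if negative curvature is
    met, fall back to the best direction found so far (or -g).
    Illustrative sketch only."""
    d = np.zeros_like(grad)
    r = -grad.copy()          # residual of H d = -g at d = 0
    p = r.copy()
    tol = eta * np.linalg.norm(grad)
    for _ in range(max_iter):
        Hp = hess_vec(p)
        curv = np.dot(p, Hp)
        if curv <= 0.0:       # negative curvature: abandon CG
            return -grad if np.allclose(d, 0.0) else d
        alpha = np.dot(r, r) / curv
        d += alpha * p
        r_new = r - alpha * Hp
        if np.linalg.norm(r_new) <= tol:
            break
        beta = np.dot(r_new, r_new) / np.dot(r, r)
        p = r_new + beta * p
        r = r_new
    return d
```

Since hess_vec can be supplied by automatic differentiation or a finite-difference of gradients, the Hessian itself is never stored, which is what makes the framework suitable for large-scale problems.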


A limited memory adaptive trust-region approach for large-scale unconstrained optimization

This study concerns a trust-region-based method for solving unconstrained optimization problems. The approach combines the compact limited-memory BFGS updating formula with an appropriate adaptive radius strategy. In our approach, the adaptive technique reduces the number of subproblems solved, while utilizing the structure of limited memory quasi-Newt...


Comparison of advanced large-scale minimization algorithms for the solution of inverse ill-posed problems

We compare the performance of several robust large-scale minimization algorithms for the unconstrained minimization of an ill-posed inverse problem. The parabolized Navier-Stokes equations model was used for adjoint parameter estimation. The methods compared consist of two versions of the nonlinear conjugate gradient method (CG), the quasi-Newton method (BFGS), the limited-memory quasi-Newton method (L-BFGS) [15...



Journal:
  • SIAM Journal on Optimization

Volume 3, Issue 

Pages  -

Publication date: 1993